Summer School of Data Science - Split '17

1. Introduction to Machine Learning with TensorFlow

This hands-on session serves as an introduction to essential TensorFlow usage and basic machine learning with TensorFlow. This notebook is partly based on, and follows the approach of, chapter 6 of the book "Deep Learning" by Ian Goodfellow, Yoshua Bengio and Aaron Courville, available at: http://www.deeplearningbook.org/.

Other useful tutorials also exist in the form of Jupyter notebooks.

This notebook covers basic TensorFlow usage concepts, applies them to elementary machine learning models such as linear and logistic regression, and finally builds and trains a simple multilayer perceptron using the established TensorFlow concepts.

Basic TensorFlow concepts

TensorFlow is an open source Python library which provides multiple APIs for building and evaluating computational graphs. These graphs can be used to represent any machine learning model, and TensorFlow provides methods for efficient optimization and evaluation of the models. The programmer's guide for TensorFlow can be found at https://www.tensorflow.org/programmers_guide/, and the full documentation is available at https://www.tensorflow.org/api_docs/python/.

The import statement for TensorFlow programs is: import tensorflow as tf. This provides access to all TensorFlow APIs, classes, methods and symbols.


In [1]:
import tensorflow as tf

Tensor

The basic concept behind TensorFlow is the tensor - an n-dimensional array of a base datatype. In TensorFlow it is represented by the tf.Tensor object which will produce a value when evaluated. A tf.Tensor object has a shape (which defines the structure of the elements) and a data type, shared by all the elements in the Tensor. The main types of tensors are:

  • Constant
  • Variable
  • Placeholder

The tf.constant() method creates a constant tensor populated with the given value; its value, shape and data type are specified by the arguments value, shape (optional) and dtype (optional).


In [2]:
# create a TensorFlow constant tensor
t = tf.constant(5)
print(t)


Tensor("Const:0", shape=(), dtype=int32)

In [3]:
# create a TensorFlow constant of a specific data type and shape
t = tf.constant(7,shape=[2,3],dtype=tf.float32,name="const_tensor")
print(t)


Tensor("const_tensor:0", shape=(2, 3), dtype=float32)

However, any Tensor is only evaluated within a Session, which is the environment in which all tensors and operations are executed.


In [4]:
# create a TensorFlow session and evaluate the created constant
sess=tf.Session()
print(sess.run(t))


[[ 7.  7.  7.]
 [ 7.  7.  7.]]
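
The session created above stays open and holds resources until it is explicitly closed. A tf.Session can also be used as a context manager, so it is closed automatically on exit - a minimal sketch (the name tmp_sess is just for illustration; the remaining cells keep using the sess object created above):

# evaluate a tensor in a session that is closed automatically on exit
with tf.Session() as tmp_sess:
    print(tmp_sess.run(tf.constant(3)))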

Other very common and useful methods for creating tensors of constant value are tf.zeros() and tf.ones().


In [5]:
# create a tensor of any shape populated with zeros and check within the session
t = tf.zeros([3,2,2])
print(sess.run(t))


[[[ 0.  0.]
  [ 0.  0.]]

 [[ 0.  0.]
  [ 0.  0.]]

 [[ 0.  0.]
  [ 0.  0.]]]

In [6]:
# create a tensor of any shape populated with ones and check within the session
t = tf.ones([2,4])
print(sess.run(t))


[[ 1.  1.  1.  1.]
 [ 1.  1.  1.  1.]]

Tensors containing random values drawn from various distributions can be created using a number of methods, the most commonly used being tf.random_uniform() and tf.random_normal().


In [7]:
# create a random tensor containing values from a uniform distribution between 10 and 20
t = tf.random_uniform([3,4,2],minval=10,maxval=20)
print(t)
print(sess.run(t))


Tensor("random_uniform:0", shape=(3, 4, 2), dtype=float32)
[[[ 16.83423424  12.7082653 ]
  [ 17.87173653  12.05112267]
  [ 10.16734123  17.5232811 ]
  [ 11.0253067   17.1642189 ]]

 [[ 16.27947998  19.20880699]
  [ 13.15979576  17.601614  ]
  [ 14.10945988  18.1970787 ]
  [ 15.78850937  17.66868401]]

 [[ 13.94125748  15.40588951]
  [ 15.38523674  17.21222687]
  [ 17.30607224  11.45663071]
  [ 18.22287369  16.17138481]]]

Simple algebraic operations such as +, -, / and * can be used with tensors directly, or by calling tf.add(), tf.subtract(), tf.divide(), or tf.multiply(). These operations are element-wise and are defined for tensors of compatible (broadcastable) shapes, as the scalar example below illustrates; the operands' data types must match. Tensors can be cast to a specific data type by calling tf.cast() (see the sketch after the division example below).


In [8]:
# add a scalar to a tensor
a = tf.ones([3,2])
sess.run(a+3)


Out[8]:
array([[ 4.,  4.],
       [ 4.,  4.],
       [ 4.,  4.]], dtype=float32)

In [9]:
# subtract two tensors
a = tf.constant(4.,shape=[2,3])
print(a)
b = tf.random_normal(shape=[2,3])
print(b)
sess.run(a-b)


Tensor("Const_1:0", shape=(2, 3), dtype=float32)
Tensor("random_normal:0", shape=(2, 3), dtype=float32)
Out[9]:
array([[ 3.76239157,  5.0288825 ,  2.72073221],
       [ 4.68344164,  3.4789114 ,  2.83210611]], dtype=float32)

In [10]:
# divide two integer tensors
a = tf.constant(4,shape=[2,3])
b = tf.constant(7,shape=[2,3])
print(a/b)


Tensor("truediv:0", shape=(2, 3), dtype=float64)

Other very useful operations include tf.exp(), tf.matmul() and tf.transpose():


In [11]:
# try out varied mathematical operations with various tensors
a = tf.exp(tf.random_normal(shape=[3,2]))
print(sess.run(a))
b = tf.matmul(a,tf.transpose(a))
print(sess.run(b))


[[ 0.53116924  2.12819934]
 [ 4.25643158  3.49265838]
 [ 1.98903203  0.74123877]]
[[  0.4170498    1.57493806   5.78904486]
 [  1.57493806   6.16942215  23.66168785]
 [  5.78904486  23.66168785  94.96226501]]

Placeholders and Variables

Placeholders and Variables are special kinds of tensors which are the essential building blocks of more complex data and computation streams. These are the most commonly used types of tensors in TensorFlow.

A Placeholder is a tensor which acts like a "promise" to provide a value when the computational graph is evaluated. Placeholders are mostly used as input points of the computational graph, where data will be provided. Evaluating a placeholder produces an error unless a value is fed to the session (via the feed_dict argument).


In [12]:
# create a placeholder and feed it a value in a session
a = tf.placeholder(dtype=tf.float32)
b = tf.exp(a)
print(b)
print(sess.run(b,feed_dict={a:5}))

# create two placeholders and a tensor implementing matrix multiplication 
x1 = tf.placeholder(dtype=tf.float32)
x2 = tf.placeholder(dtype=tf.float32)
y = tf.matmul(x1,x2)
print(sess.run(y,{x2:[[1,2],[3,4]],x1:[[1,2],[3,4]]}))


Tensor("Exp_1:0", dtype=float32)
148.413
[[  7.  10.]
 [ 15.  22.]]

A Variable is a tensor which allows trainable parameters to be added to the computational graph. Constants are initialized when created; variables, in contrast, need to be initialized within the session (and the initialization procedure must be defined, e.g. via tf.global_variables_initializer()). Variables can be "manually" assigned a new value using tf.assign, and their state is kept within the session object. This is mostly used for model training, during which the variables are updated by the optimization process.


In [13]:
# create a variable, initialize it, and assign a new value within a session
sess = tf.Session()
a = tf.Variable(5)
print(a)
sess.run(tf.global_variables_initializer())
sess.run(a)
sess.run(tf.assign(a,6))
print(sess.run(a))

sess.close()

sess = tf.Session()
sess.run(tf.global_variables_initializer())
print(sess.run(a))


<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
6
5

Linear regression in TensorFlow

Linear regression is one of the simplest and most commonly used regression models. The multivariate linear regression can be written as:

$$y = w^{T}x + b$$

where $y \in \mathbb{R}$ is the output, $w \in \mathbb{R}^{p}$ is a column vector containing $p$ weights for $p$ features in $x \in \mathbb{R}^{p}$, and $b \in \mathbb{R}$ is the bias. The parameters contained in $w$ and $b$ are also called coefficients and are trained by using a gradient descent algorithm.
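
For reference, the multivariate form maps directly onto tf.matmul - a minimal sketch, assuming a hypothetical setting with $p=3$ features (the names with the _mv suffix are illustrative; the exercise below builds the univariate case):

# hypothetical multivariate linear regression: one prediction per row of x_mv
p = 3
x_mv = tf.placeholder(dtype=tf.float32, shape=[None, p])   # batch of p-dimensional inputs
w_mv = tf.Variable(tf.random_normal([p, 1]))               # column vector of p weights
b_mv = tf.Variable(tf.random_normal([1]))                  # bias
y_mv = tf.matmul(x_mv, w_mv) + b_mv                        # w^T x + b for every row of x_mv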

Exercise:

Let us build a univariate linear regression model for a simple problem, using the previously introduced TensorFlow concepts:

  • The model input $x$ is a placeholder for data
  • The trainable model parameters $w$ and $b$ are defined as TensorFlow Variables
  • The model output $\hat{y}$ is a Tensor
  • The observed output $y$ is also a placeholder, where data will be provided for training purposes

In [14]:
#define placeholders for data
x = tf.placeholder(dtype=tf.float32,shape=[None])
y = tf.placeholder(dtype=tf.float32,shape=[None])

#define model parameters as variables
w = tf.Variable(tf.random_normal(shape=()))
b = tf.Variable(tf.random_normal([]))

#create a tensor which calculates the model output
y_model = w*x + b

To train a model built in TensorFlow, a loss function needs to be defined, most commonly as a reduction operation. An optimizer object is then created, and its minimize() method is called to obtain a training operation that updates the model variables so as to minimize the selected loss. When creating the optimizer, a learning rate has to be chosen - together with the number of training epochs, it can greatly influence the training process. With an appropriate learning rate, the optimization converges quickly.


In [15]:
#define the loss function as the mean of all squared errors (MSE)
loss = tf.reduce_mean(tf.square(y_model-y))

#create a gradient descent optimizer
optimizer = tf.train.GradientDescentOptimizer(0.1)

#create a train operation
train = optimizer.minimize(loss)

#generate data to train the regression
import numpy as np
x_train = np.random.normal(size=10)
y_train = 5*x_train + 10 + np.random.normal(size=10)/10
print(x_train,y_train)

#initialize variables, run 100 epochs of training algorithm
sess.run(tf.global_variables_initializer())
for epoch in range(100):
    sess.run(train,{y:y_train,x:x_train})
    print('w:',sess.run(w),', b:',sess.run(b),', loss:',sess.run(loss,{y:y_train,x:x_train}))


[-1.24455598  0.54034035  0.67284074 -0.31131036 -1.54162209 -0.82222324
 -0.4128549  -1.02316814 -0.27822906  0.8220521 ] [  3.86083594  12.62098343  13.4825515    8.34129937   2.34952128
   5.802013     7.89253573   4.99876662   8.6800274   14.12025928]
w: 1.69321 , b: 1.86731 , loss: 55.0208
w: 1.59488 , b: 3.25869 , loss: 37.6063
w: 1.61127 , b: 4.36472 , loss: 26.5811
w: 1.70484 , b: 5.25072 , loss: 19.3754
w: 1.8483 , b: 5.96626 , loss: 14.503
w: 2.022 , b: 6.54901 , loss: 11.0944
w: 2.21188 , b: 7.02772 , loss: 8.63271
w: 2.40806 , b: 7.42435 , loss: 6.80452
w: 2.6037 , b: 7.75578 , loss: 5.41489
w: 2.79419 , b: 8.035 , loss: 4.33898
w: 2.97653 , b: 8.27208 , loss: 3.49415
w: 3.14889 , b: 8.47488 , loss: 2.82378
w: 3.3103 , b: 8.64952 , loss: 2.28779
w: 3.46034 , b: 8.80085 , loss: 1.8569
w: 3.59903 , b: 8.93271 , loss: 1.50917
w: 3.72665 , b: 9.04818 , loss: 1.22777
w: 3.84365 , b: 9.14975 , loss: 0.999633
w: 3.95062 , b: 9.23942 , loss: 0.814426
w: 4.04817 , b: 9.31886 , loss: 0.663935
w: 4.13699 , b: 9.38943 , loss: 0.541574
w: 4.21771 , b: 9.45227 , loss: 0.442044
w: 4.29099 , b: 9.50836 , loss: 0.361058
w: 4.35743 , b: 9.55851 , loss: 0.295148
w: 4.41764 , b: 9.60341 , loss: 0.241499
w: 4.47215 , b: 9.64366 , loss: 0.197827
w: 4.52147 , b: 9.67978 , loss: 0.162273
w: 4.56609 , b: 9.71224 , loss: 0.133327
w: 4.60642 , b: 9.74141 , loss: 0.109761
w: 4.64287 , b: 9.76765 , loss: 0.0905736
w: 4.67581 , b: 9.79126 , loss: 0.0749513
w: 4.70556 , b: 9.81252 , loss: 0.0622314
w: 4.73243 , b: 9.83168 , loss: 0.0518748
w: 4.75669 , b: 9.84893 , loss: 0.0434425
w: 4.7786 , b: 9.86448 , loss: 0.0365766
w: 4.79838 , b: 9.8785 , loss: 0.0309864
w: 4.81624 , b: 9.89114 , loss: 0.0264345
w: 4.83236 , b: 9.90253 , loss: 0.0227283
w: 4.8469 , b: 9.91281 , loss: 0.0197107
w: 4.86004 , b: 9.92208 , loss: 0.0172537
w: 4.87189 , b: 9.93044 , loss: 0.0152531
w: 4.88258 , b: 9.93798 , loss: 0.0136242
w: 4.89223 , b: 9.94478 , loss: 0.0122979
w: 4.90094 , b: 9.95092 , loss: 0.011218
w: 4.90881 , b: 9.95645 , loss: 0.0103387
w: 4.9159 , b: 9.96145 , loss: 0.0096228
w: 4.9223 , b: 9.96595 , loss: 0.00903985
w: 4.92808 , b: 9.97002 , loss: 0.00856524
w: 4.93329 , b: 9.97369 , loss: 0.0081788
w: 4.93799 , b: 9.977 , loss: 0.00786412
w: 4.94224 , b: 9.97998 , loss: 0.00760787
w: 4.94607 , b: 9.98268 , loss: 0.00739925
w: 4.94952 , b: 9.98511 , loss: 0.00722942
w: 4.95264 , b: 9.9873 , loss: 0.00709113
w: 4.95546 , b: 9.98928 , loss: 0.0069785
w: 4.958 , b: 9.99107 , loss: 0.00688684
w: 4.96029 , b: 9.99268 , loss: 0.00681217
w: 4.96236 , b: 9.99414 , loss: 0.00675137
w: 4.96422 , b: 9.99545 , loss: 0.00670192
w: 4.96591 , b: 9.99663 , loss: 0.0066616
w: 4.96743 , b: 9.9977 , loss: 0.00662878
w: 4.9688 , b: 9.99866 , loss: 0.00660207
w: 4.97003 , b: 9.99953 , loss: 0.0065803
w: 4.97115 , b: 10.0003 , loss: 0.00656261
w: 4.97216 , b: 10.001 , loss: 0.00654817
w: 4.97307 , b: 10.0017 , loss: 0.00653645
w: 4.97389 , b: 10.0022 , loss: 0.00652686
w: 4.97463 , b: 10.0028 , loss: 0.00651908
w: 4.97529 , b: 10.0032 , loss: 0.00651275
w: 4.9759 , b: 10.0037 , loss: 0.00650759
w: 4.97644 , b: 10.004 , loss: 0.00650339
w: 4.97693 , b: 10.0044 , loss: 0.00649998
w: 4.97737 , b: 10.0047 , loss: 0.00649719
w: 4.97777 , b: 10.005 , loss: 0.00649494
w: 4.97813 , b: 10.0052 , loss: 0.00649309
w: 4.97846 , b: 10.0055 , loss: 0.00649156
w: 4.97875 , b: 10.0057 , loss: 0.00649033
w: 4.97902 , b: 10.0059 , loss: 0.00648933
w: 4.97925 , b: 10.006 , loss: 0.00648852
w: 4.97947 , b: 10.0062 , loss: 0.00648788
w: 4.97966 , b: 10.0063 , loss: 0.00648734
w: 4.97984 , b: 10.0064 , loss: 0.00648688
w: 4.98 , b: 10.0065 , loss: 0.00648651
w: 4.98014 , b: 10.0066 , loss: 0.00648624
w: 4.98027 , b: 10.0067 , loss: 0.006486
w: 4.98039 , b: 10.0068 , loss: 0.00648583
w: 4.98049 , b: 10.0069 , loss: 0.00648565
w: 4.98059 , b: 10.007 , loss: 0.00648553
w: 4.98067 , b: 10.007 , loss: 0.00648543
w: 4.98075 , b: 10.0071 , loss: 0.00648536
w: 4.98082 , b: 10.0071 , loss: 0.00648529
w: 4.98088 , b: 10.0072 , loss: 0.00648523
w: 4.98094 , b: 10.0072 , loss: 0.00648517
w: 4.98099 , b: 10.0072 , loss: 0.00648514
w: 4.98104 , b: 10.0073 , loss: 0.00648511
w: 4.98108 , b: 10.0073 , loss: 0.00648508
w: 4.98112 , b: 10.0073 , loss: 0.00648506
w: 4.98115 , b: 10.0074 , loss: 0.00648506
w: 4.98118 , b: 10.0074 , loss: 0.00648504
w: 4.98121 , b: 10.0074 , loss: 0.00648502
w: 4.98123 , b: 10.0074 , loss: 0.00648502
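
After training, the same y_model tensor can be evaluated on new inputs by feeding only x - a minimal usage sketch (x_new is just an illustrative set of inputs), assuming the session and the trained variables from the cells above are still available:

# predict with the trained linear model on a few new inputs
x_new = np.array([-1.0, 0.0, 1.0])
print(sess.run(y_model, {x: x_new}))   # should be close to 5*x_new + 10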

Logistic Regression

Logistic regression is a very common and simple linear model for classification purposes, based on linear regression and the logistic function:

$$y = \frac{1}{1+e^{-(w^{T}x + b)}}$$

Due to the nature of the logistic function, the output values lie in the range $(0,1)$ and can be interpreted as the probability of the positive class. As with linear regression, the variables defined within the logistic regression model are trainable parameters that can be fitted by various optimization algorithms.
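
The logistic function itself is also available as the built-in op tf.sigmoid() - a minimal sketch of its behaviour (the exercise below writes the formula out explicitly):

# the built-in sigmoid computes 1/(1+exp(-z)) element-wise, giving values strictly between 0 and 1
z = tf.constant([-2.0, 0.0, 2.0])
print(sess.run(tf.sigmoid(z)))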

Let us build a logistic regression for the well-known XOR problem.


In [16]:
#generate XOR training data
import numpy as np
x_train = np.array([[0,0],[0,1],[1,0],[1,1]])
y_train = np.array([[0],[1],[1],[0]])

#import matplotlib for visualization
%matplotlib inline
import matplotlib.pyplot as plt

#logical indices of data where the outputs are 1 and 0
t = np.where(y_train==1)[0]
f = np.where(y_train==0)[0]

#scatter plot of the data
plt.scatter(x_train[t,0],x_train[t,1],c='b',marker='x',s=70)
plt.scatter(x_train[f,0],x_train[f,1],c='r',marker='o',s=70)


Out[16]:
<matplotlib.collections.PathCollection at 0x1210b0ac8>

Exercise:

  • The model input $x$ is a placeholder for data
  • The trainable model parameters $w$ and $b$ are defined as TensorFlow Variables
  • The model output $\hat{y}$ is a Tensor
  • The observed output $y$ is also a placeholder, where output data will be provided in order to train the model

In [17]:
#define placeholders for the data
x = tf.placeholder(dtype=tf.float32,shape=[None,2])
y = tf.placeholder(dtype=tf.float32,shape=[None,1])

#define variables for the trainable parameters of the model
w = tf.Variable(tf.random_normal([2,1]),name="weights")
b = tf.Variable(tf.random_normal([1]), name="bias")

#create a tensor to calculate the model output
y_model = 1/(1+tf.exp(-(tf.matmul(x,w) + b)))

#define the loss function, create the optimizer and the training operation
loss = tf.reduce_mean(tf.square(y_model-y))
optimizer = tf.train.GradientDescentOptimizer(0.3)
train = optimizer.minimize(loss)

#train the model
sess.run(tf.global_variables_initializer())

for epoch in range(1000):
    sess.run(train,{x:x_train,y:y_train})
    print('w:',sess.run(w,{y:y_train,x:x_train}),', b:',sess.run(b,{y:y_train,x:x_train}),', loss:',sess.run(loss,{y:y_train,x:x_train}))


w: [[-1.02229416]
 [ 0.75775838]] , b: [ 0.93021137] , loss: 0.312112
w: [[-1.02493966]
 [ 0.73862422]] , b: [ 0.90882725] , loss: 0.309364
w: [[-1.027354  ]
 [ 0.71971798]] , b: [ 0.88780278] , loss: 0.306703
w: [[-1.02952194]
 [ 0.70107347]] , b: [ 0.86716753] , loss: 0.304133
w: [[-1.03142953]
 [ 0.68272233]] , b: [ 0.84694862] , loss: 0.301661
w: [[-1.03306508]
 [ 0.66469371]] , b: [ 0.82717061] , loss: 0.29929
w: [[-1.03441858]
 [ 0.64701402]] , b: [ 0.80785525] , loss: 0.297026
w: [[-1.03548217]
 [ 0.62970668]] , b: [ 0.78902119] , loss: 0.294868
w: [[-1.03625   ]
 [ 0.61279207]] , b: [ 0.77068412] , loss: 0.292819
w: [[-1.03671849]
 [ 0.59628731]] , b: [ 0.75285655] , loss: 0.290878
w: [[-1.03688562]
 [ 0.58020645]] , b: [ 0.7355479] , loss: 0.289044
w: [[-1.03675163]
 [ 0.56456041]] , b: [ 0.71876472] , loss: 0.287315
w: [[-1.03631818]
 [ 0.54935712]] , b: [ 0.70251071] , loss: 0.285688
w: [[-1.03558874]
 [ 0.53460163]] , b: [ 0.68678683] , loss: 0.284161
w: [[-1.03456795]
 [ 0.52029645]] , b: [ 0.67159158] , loss: 0.282729
w: [[-1.0332619 ]
 [ 0.50644159]] , b: [ 0.65692121] , loss: 0.281389
w: [[-1.0316776]
 [ 0.4930349]] , b: [ 0.64276999] , loss: 0.280135
w: [[-1.02982306]
 [ 0.4800722 ]] , b: [ 0.62913042] , loss: 0.278963
w: [[-1.02770698]
 [ 0.46754766]] , b: [ 0.61599338] , loss: 0.277869
w: [[-1.02533853]
 [ 0.45545387]] , b: [ 0.60334849] , loss: 0.276847
w: [[-1.02272749]
 [ 0.44378221]] , b: [ 0.5911842] , loss: 0.275894
w: [[-1.01988387]
 [ 0.43252292]] , b: [ 0.57948804] , loss: 0.275004
w: [[-1.01681793]
 [ 0.4216654 ]] , b: [ 0.56824684] , loss: 0.274172
w: [[-1.01353991]
 [ 0.41119835]] , b: [ 0.55744684] , loss: 0.273396
w: [[-1.01006007]
 [ 0.4011099 ]] , b: [ 0.5470739] , loss: 0.272669
w: [[-1.00638878]
 [ 0.39138788]] , b: [ 0.53711361] , loss: 0.27199
w: [[-1.00253618]
 [ 0.38201976]] , b: [ 0.52755135] , loss: 0.271353
w: [[-0.99851215]
 [ 0.37299293]] , b: [ 0.5183726] , loss: 0.270757
w: [[-0.99432647]
 [ 0.36429468]] , b: [ 0.50956267] , loss: 0.270196
w: [[-0.98998868]
 [ 0.35591239]] , b: [ 0.50110716] , loss: 0.269669
w: [[-0.98550802]
 [ 0.34783348]] , b: [ 0.49299178] , loss: 0.269172
w: [[-0.98089343]
 [ 0.34004563]] , b: [ 0.48520249] , loss: 0.268703
w: [[-0.97615355]
 [ 0.3325367 ]] , b: [ 0.4777256] , loss: 0.26826
w: [[-0.97129667]
 [ 0.32529482]] , b: [ 0.47054768] , loss: 0.267841
w: [[-0.96633077]
 [ 0.31830841]] , b: [ 0.46365568] , loss: 0.267443
w: [[-0.9612636 ]
 [ 0.31156623]] , b: [ 0.45703691] , loss: 0.267064
w: [[-0.95610243]
 [ 0.30505738]] , b: [ 0.45067912] , loss: 0.266704
w: [[-0.95085442]
 [ 0.29877129]] , b: [ 0.44457039] , loss: 0.26636
w: [[-0.94552624]
 [ 0.29269782]] , b: [ 0.43869928] , loss: 0.266031
w: [[-0.94012433]
 [ 0.28682715]] , b: [ 0.43305472] , loss: 0.265716
w: [[-0.93465483]
 [ 0.28114986]] , b: [ 0.42762607] , loss: 0.265413
w: [[-0.92912358]
 [ 0.27565691]] , b: [ 0.42240313] , loss: 0.265123
w: [[-0.92353618]
 [ 0.27033964]] , b: [ 0.41737604] , loss: 0.264843
w: [[-0.91789788]
 [ 0.26518974]] , b: [ 0.41253543] , loss: 0.264573
w: [[-0.9122138 ]
 [ 0.26019931]] , b: [ 0.40787223] , loss: 0.264312
w: [[-0.90648872]
 [ 0.25536075]] , b: [ 0.40337783] , loss: 0.264059
w: [[-0.90072715]
 [ 0.25066683]] , b: [ 0.39904398] , loss: 0.263814
w: [[-0.89493346]
 [ 0.24611066]] , b: [ 0.39486274] , loss: 0.263577
w: [[-0.88911176]
 [ 0.24168567]] , b: [ 0.39082664] , loss: 0.263346
w: [[-0.88326591]
 [ 0.23738563]] , b: [ 0.38692844] , loss: 0.263121
w: [[-0.87739962]
 [ 0.23320459]] , b: [ 0.38316128] , loss: 0.262902
w: [[-0.87151641]
 [ 0.2291369 ]] , b: [ 0.3795186] , loss: 0.262689
w: [[-0.86561954]
 [ 0.22517718]] , b: [ 0.37599418] , loss: 0.26248
w: [[-0.85971224]
 [ 0.22132036]] , b: [ 0.37258205] , loss: 0.262277
w: [[-0.85379744]
 [ 0.2175616 ]] , b: [ 0.36927658] , loss: 0.262078
w: [[-0.84787792]
 [ 0.21389632]] , b: [ 0.36607239] , loss: 0.261883
w: [[-0.84195638]
 [ 0.21032016]] , b: [ 0.36296433] , loss: 0.261692
w: [[-0.83603537]
 [ 0.20682901]] , b: [ 0.35994756] , loss: 0.261505
w: [[-0.83011723]
 [ 0.20341899]] , b: [ 0.35701743] , loss: 0.261322
w: [[-0.82420421]
 [ 0.20008639]] , b: [ 0.35416955] , loss: 0.261142
w: [[-0.8182984 ]
 [ 0.19682771]] , b: [ 0.35139972] , loss: 0.260966
w: [[-0.81240183]
 [ 0.19363967]] , b: [ 0.34870398] , loss: 0.260793
w: [[-0.80651641]
 [ 0.19051912]] , b: [ 0.34607857] , loss: 0.260623
w: [[-0.80064392]
 [ 0.18746312]] , b: [ 0.34351993] , loss: 0.260455
w: [[-0.79478598]
 [ 0.18446888]] , b: [ 0.34102464] , loss: 0.260291
w: [[-0.78894418]
 [ 0.18153375]] , b: [ 0.33858949] , loss: 0.26013
w: [[-0.78312004]
 [ 0.17865524]] , b: [ 0.33621144] , loss: 0.259971
w: [[-0.7773149 ]
 [ 0.17583099]] , b: [ 0.33388758] , loss: 0.259814
w: [[-0.77153009]
 [ 0.17305878]] , b: [ 0.33161515] , loss: 0.259661
w: [[-0.76576686]
 [ 0.17033651]] , b: [ 0.32939157] , loss: 0.259509
w: [[-0.76002634]
 [ 0.1676622 ]] , b: [ 0.32721439] , loss: 0.25936
w: [[-0.75430959]
 [ 0.16503398]] , b: [ 0.32508126] , loss: 0.259214
w: [[-0.74861765]
 [ 0.16245009]] , b: [ 0.32298997] , loss: 0.25907
w: [[-0.74295145]
 [ 0.15990885]] , b: [ 0.32093844] , loss: 0.258928
w: [[-0.73731184]
 [ 0.15740868]] , b: [ 0.3189247] , loss: 0.258788
w: [[-0.73169965]
 [ 0.15494813]] , b: [ 0.31694683] , loss: 0.25865
w: [[-0.72611564]
 [ 0.15252578]] , b: [ 0.31500313] , loss: 0.258515
w: [[-0.72056055]
 [ 0.15014032]] , b: [ 0.31309187] , loss: 0.258381
w: [[-0.71503496]
 [ 0.14779049]] , b: [ 0.3112115] , loss: 0.25825
w: [[-0.70953953]
 [ 0.14547512]] , b: [ 0.30936047] , loss: 0.25812
w: [[-0.7040748 ]
 [ 0.14319311]] , b: [ 0.30753741] , loss: 0.257993
w: [[-0.6986413 ]
 [ 0.14094342]] , b: [ 0.30574095] , loss: 0.257867
w: [[-0.69323951]
 [ 0.13872506]] , b: [ 0.30396983] , loss: 0.257743
w: [[-0.68786985]
 [ 0.13653709]] , b: [ 0.30222288] , loss: 0.257622
w: [[-0.68253273]
 [ 0.13437864]] , b: [ 0.30049893] , loss: 0.257502
w: [[-0.67722845]
 [ 0.13224889]] , b: [ 0.29879695] , loss: 0.257384
w: [[-0.67195743]
 [ 0.13014705]] , b: [ 0.29711592] , loss: 0.257267
w: [[-0.66671991]
 [ 0.1280724 ]] , b: [ 0.29545489] , loss: 0.257153
w: [[-0.66151613]
 [ 0.12602422]] , b: [ 0.29381296] , loss: 0.25704
w: [[-0.65634638]
 [ 0.12400185]] , b: [ 0.2921893] , loss: 0.256929
w: [[-0.65121084]
 [ 0.1220047 ]] , b: [ 0.2905831] , loss: 0.256819
w: [[-0.6461097 ]
 [ 0.12003215]] , b: [ 0.2889936] , loss: 0.256712
w: [[-0.64104307]
 [ 0.11808366]] , b: [ 0.28742009] , loss: 0.256606
w: [[-0.63601112]
 [ 0.11615871]] , b: [ 0.28586191] , loss: 0.256501
w: [[-0.63101399]
 [ 0.11425679]] , b: [ 0.28431842] , loss: 0.256398
w: [[-0.62605172]
 [ 0.11237744]] , b: [ 0.28278905] , loss: 0.256297
w: [[-0.62112439]
 [ 0.11052021]] , b: [ 0.28127322] , loss: 0.256197
w: [[-0.61623204]
 [ 0.10868468]] , b: [ 0.2797704] , loss: 0.256099
w: [[-0.61137474]
 [ 0.10687045]] , b: [ 0.27828011] , loss: 0.256003
w: [[-0.60655248]
 [ 0.10507713]] , b: [ 0.27680185] , loss: 0.255908
w: [[-0.60176528]
 [ 0.10330438]] , b: [ 0.27533522] , loss: 0.255814
w: [[-0.59701312]
 [ 0.10155183]] , b: [ 0.27387977] , loss: 0.255722
w: [[-0.59229594]
 [ 0.09981917]] , b: [ 0.27243513] , loss: 0.255631
w: [[-0.58761376]
 [ 0.09810608]] , b: [ 0.27100089] , loss: 0.255541
w: [[-0.58296651]
 [ 0.09641226]] , b: [ 0.26957676] , loss: 0.255453
w: [[-0.57835412]
 [ 0.09473746]] , b: [ 0.2681624] , loss: 0.255367
w: [[-0.57377648]
 [ 0.09308136]] , b: [ 0.26675749] , loss: 0.255282
w: [[-0.56923354]
 [ 0.09144375]] , b: [ 0.26536176] , loss: 0.255198
w: [[-0.56472522]
 [ 0.08982435]] , b: [ 0.26397491] , loss: 0.255115
w: [[-0.56025136]
 [ 0.08822294]] , b: [ 0.2625967] , loss: 0.255034
w: [[-0.55581188]
 [ 0.08663929]] , b: [ 0.26122689] , loss: 0.254954
w: [[-0.55140668]
 [ 0.08507317]] , b: [ 0.25986528] , loss: 0.254875
w: [[-0.54703557]
 [ 0.08352439]] , b: [ 0.25851163] , loss: 0.254798
w: [[-0.54269844]
 [ 0.08199275]] , b: [ 0.25716576] , loss: 0.254721
w: [[-0.53839517]
 [ 0.08047803]] , b: [ 0.25582749] , loss: 0.254646
w: [[-0.53412563]
 [ 0.07898007]] , b: [ 0.2544966] , loss: 0.254573
w: [[-0.52988958]
 [ 0.07749867]] , b: [ 0.25317299] , loss: 0.2545
w: [[-0.52568692]
 [ 0.07603367]] , b: [ 0.25185645] , loss: 0.254428
w: [[-0.52151752]
 [ 0.07458489]] , b: [ 0.25054687] , loss: 0.254358
w: [[-0.51738113]
 [ 0.07315217]] , b: [ 0.24924409] , loss: 0.254289
w: [[-0.51327759]
 [ 0.07173534]] , b: [ 0.24794801] , loss: 0.254221
w: [[-0.50920677]
 [ 0.07033426]] , b: [ 0.24665847] , loss: 0.254154
w: [[-0.50516844]
 [ 0.06894878]] , b: [ 0.24537539] , loss: 0.254088
w: [[-0.50116241]
 [ 0.06757873]] , b: [ 0.24409866] , loss: 0.254023
w: [[-0.49718854]
 [ 0.06622399]] , b: [ 0.24282818] , loss: 0.253959
w: [[-0.49324661]
 [ 0.06488441]] , b: [ 0.24156384] , loss: 0.253896
w: [[-0.48933643]
 [ 0.06355985]] , b: [ 0.24030557] , loss: 0.253834
w: [[-0.48545781]
 [ 0.06225019]] , b: [ 0.23905329] , loss: 0.253773
w: [[-0.48161054]
 [ 0.06095528]] , b: [ 0.23780692] , loss: 0.253713
w: [[-0.47779441]
 [ 0.05967499]] , b: [ 0.23656638] , loss: 0.253655
w: [[-0.47400922]
 [ 0.0584092 ]] , b: [ 0.23533161] , loss: 0.253597
w: [[-0.47025478]
 [ 0.05715779]] , b: [ 0.23410255] , loss: 0.25354
w: [[-0.46653086]
 [ 0.05592063]] , b: [ 0.23287913] , loss: 0.253483
w: [[-0.46283728]
 [ 0.0546976 ]] , b: [ 0.23166129] , loss: 0.253428
w: [[-0.45917383]
 [ 0.05348857]] , b: [ 0.23044899] , loss: 0.253374
w: [[-0.45554027]
 [ 0.05229344]] , b: [ 0.22924218] , loss: 0.253321
w: [[-0.45193642]
 [ 0.05111208]] , b: [ 0.2280408] , loss: 0.253268
w: [[-0.44836205]
 [ 0.04994437]] , b: [ 0.22684482] , loss: 0.253216
w: [[-0.44481698]
 [ 0.04879022]] , b: [ 0.2256542] , loss: 0.253165
w: [[-0.44130096]
 [ 0.0476495 ]] , b: [ 0.2244689] , loss: 0.253115
w: [[-0.43781379]
 [ 0.04652209]] , b: [ 0.22328888] , loss: 0.253066
w: [[-0.43435526]
 [ 0.04540789]] , b: [ 0.22211409] , loss: 0.253018
w: [[-0.43092516]
 [ 0.04430679]] , b: [ 0.22094451] , loss: 0.25297
w: [[-0.42752329]
 [ 0.04321868]] , b: [ 0.21978012] , loss: 0.252923
w: [[-0.42414942]
 [ 0.04214346]] , b: [ 0.21862088] , loss: 0.252877
w: [[-0.42080334]
 [ 0.041081  ]] , b: [ 0.21746676] , loss: 0.252832
w: [[-0.41748485]
 [ 0.04003122]] , b: [ 0.21631774] , loss: 0.252787
w: [[-0.41419372]
 [ 0.03899401]] , b: [ 0.21517381] , loss: 0.252743
w: [[-0.41092974]
 [ 0.03796924]] , b: [ 0.21403493] , loss: 0.2527
w: [[-0.4076927 ]
 [ 0.03695684]] , b: [ 0.21290107] , loss: 0.252658
w: [[-0.40448242]
 [ 0.03595668]] , b: [ 0.21177222] , loss: 0.252616
w: [[-0.40129867]
 [ 0.03496866]] , b: [ 0.21064836] , loss: 0.252575
w: [[-0.39814124]
 [ 0.03399269]] , b: [ 0.20952947] , loss: 0.252535
w: [[-0.39500991]
 [ 0.03302865]] , b: [ 0.20841554] , loss: 0.252495
w: [[-0.39190447]
 [ 0.03207646]] , b: [ 0.20730653] , loss: 0.252456
w: [[-0.38882473]
 [ 0.031136  ]] , b: [ 0.20620245] , loss: 0.252417
w: [[-0.38577047]
 [ 0.03020719]] , b: [ 0.20510326] , loss: 0.252379
w: [[-0.38274151]
 [ 0.0292899 ]] , b: [ 0.20400895] , loss: 0.252342
w: [[-0.37973762]
 [ 0.02838406]] , b: [ 0.20291951] , loss: 0.252306
w: [[-0.37675861]
 [ 0.02748955]] , b: [ 0.20183493] , loss: 0.25227
w: [[-0.37380427]
 [ 0.02660629]] , b: [ 0.20075519] , loss: 0.252234
w: [[-0.3708744 ]
 [ 0.02573416]] , b: [ 0.19968028] , loss: 0.252199
w: [[-0.3679688 ]
 [ 0.02487307]] , b: [ 0.19861019] , loss: 0.252165
w: [[-0.36508727]
 [ 0.02402294]] , b: [ 0.19754489] , loss: 0.252131
w: [[-0.36222962]
 [ 0.02318365]] , b: [ 0.19648437] , loss: 0.252098
w: [[-0.35939565]
 [ 0.02235511]] , b: [ 0.19542864] , loss: 0.252065
w: [[-0.35658514]
 [ 0.02153722]] , b: [ 0.19437768] , loss: 0.252033
w: [[-0.35379794]
 [ 0.0207299 ]] , b: [ 0.19333147] , loss: 0.252002
w: [[-0.35103384]
 [ 0.01993304]] , b: [ 0.19228999] , loss: 0.251971
w: [[-0.34829262]
 [ 0.01914655]] , b: [ 0.19125324] , loss: 0.25194
w: [[-0.34557411]
 [ 0.01837033]] , b: [ 0.19022122] , loss: 0.25191
w: [[-0.3428781 ]
 [ 0.01760429]] , b: [ 0.1891939] , loss: 0.25188
w: [[-0.34020445]
 [ 0.01684834]] , b: [ 0.18817128] , loss: 0.251851
w: [[-0.33755293]
 [ 0.01610238]] , b: [ 0.18715334] , loss: 0.251823
w: [[-0.33492336]
 [ 0.01536633]] , b: [ 0.18614008] , loss: 0.251795
w: [[-0.33231556]
 [ 0.01464008]] , b: [ 0.18513148] , loss: 0.251767
w: [[-0.32972935]
 [ 0.01392356]] , b: [ 0.18412752] , loss: 0.25174
w: [[-0.32716453]
 [ 0.01321666]] , b: [ 0.18312822] , loss: 0.251713
w: [[-0.32462093]
 [ 0.01251929]] , b: [ 0.18213354] , loss: 0.251686
w: [[-0.3220984 ]
 [ 0.01183136]] , b: [ 0.18114346] , loss: 0.25166
w: [[-0.31959674]
 [ 0.01115278]] , b: [ 0.180158] , loss: 0.251635
w: [[-0.31711575]
 [ 0.01048348]] , b: [ 0.17917714] , loss: 0.25161
w: [[-0.31465527]
 [ 0.00982334]] , b: [ 0.17820086] , loss: 0.251585
w: [[-0.31221512]
 [ 0.0091723 ]] , b: [ 0.17722915] , loss: 0.251561
w: [[-0.30979514]
 [ 0.00853025]] , b: [ 0.17626201] , loss: 0.251537
w: [[-0.30739513]
 [ 0.00789711]] , b: [ 0.17529942] , loss: 0.251513
w: [[-0.30501494]
 [ 0.0072728 ]] , b: [ 0.17434138] , loss: 0.25149
w: [[-0.30265442]
 [ 0.00665721]] , b: [ 0.17338786] , loss: 0.251467
w: [[-0.30031335]
 [ 0.00605028]] , b: [ 0.17243886] , loss: 0.251445
w: [[-0.2979916 ]
 [ 0.00545191]] , b: [ 0.17149436] , loss: 0.251423
w: [[-0.29568902]
 [ 0.00486202]] , b: [ 0.17055437] , loss: 0.251401
w: [[-0.29340541]
 [ 0.00428053]] , b: [ 0.16961886] , loss: 0.25138
w: [[-0.29114065]
 [ 0.00370735]] , b: [ 0.16868782] , loss: 0.251359
w: [[-0.2888945 ]
 [ 0.00314239]] , b: [ 0.16776125] , loss: 0.251338
w: [[-0.28666687]
 [ 0.00258558]] , b: [ 0.16683912] , loss: 0.251318
w: [[-0.28445759]
 [ 0.00203682]] , b: [ 0.16592143] , loss: 0.251298
w: [[-0.2822665 ]
 [ 0.00149604]] , b: [ 0.16500817] , loss: 0.251278
w: [[-0.28009343]
 [ 0.00096316]] , b: [ 0.16409932] , loss: 0.251259
w: [[-0.27793822]
 [ 0.0004381 ]] , b: [ 0.16319488] , loss: 0.25124
w: [[ -2.75800705e-01]
 [ -7.92315695e-05]] , b: [ 0.16229482] , loss: 0.251221
w: [[-0.27368078]
 [-0.0005889 ]] , b: [ 0.16139914] , loss: 0.251203
w: [[-0.27157825]
 [-0.00109101]] , b: [ 0.16050783] , loss: 0.251185
w: [[-0.26949298]
 [-0.00158561]] , b: [ 0.15962087] , loss: 0.251167
w: [[-0.26742482]
 [-0.00207279]] , b: [ 0.15873826] , loss: 0.251149
w: [[-0.26537362]
 [-0.00255264]] , b: [ 0.15785997] , loss: 0.251132
w: [[-0.26333925]
 [-0.00302522]] , b: [ 0.15698598] , loss: 0.251115
w: [[-0.26132154]
 [-0.00349062]] , b: [ 0.15611631] , loss: 0.251098
w: [[-0.25932038]
 [-0.0039489 ]] , b: [ 0.15525092] , loss: 0.251082
w: [[-0.2573356 ]
 [-0.00440016]] , b: [ 0.15438981] , loss: 0.251065
w: [[-0.25536704]
 [-0.00484445]] , b: [ 0.15353297] , loss: 0.251049
w: [[-0.2534146 ]
 [-0.00528186]] , b: [ 0.15268038] , loss: 0.251034
w: [[-0.25147811]
 [-0.00571247]] , b: [ 0.15183203] , loss: 0.251018
w: [[-0.24955745]
 [-0.00613633]] , b: [ 0.15098789] , loss: 0.251003
w: [[-0.24765249]
 [-0.00655354]] , b: [ 0.15014797] , loss: 0.250988
w: [[-0.24576306]
 [-0.00696415]] , b: [ 0.14931226] , loss: 0.250973
w: [[-0.24388906]
 [-0.00736824]] , b: [ 0.14848073] , loss: 0.250959
w: [[-0.24203035]
 [-0.00776588]] , b: [ 0.14765337] , loss: 0.250945
w: [[-0.2401868 ]
 [-0.00815715]] , b: [ 0.14683016] , loss: 0.25093
w: [[-0.23835826]
 [-0.00854211]] , b: [ 0.14601108] , loss: 0.250917
w: [[-0.23654461]
 [-0.00892084]] , b: [ 0.14519615] , loss: 0.250903
w: [[-0.23474573]
 [-0.00929339]] , b: [ 0.14438534] , loss: 0.25089
w: [[-0.23296148]
 [-0.00965985]] , b: [ 0.14357862] , loss: 0.250877
w: [[-0.23119174]
 [-0.01002027]] , b: [ 0.142776] , loss: 0.250864
w: [[-0.22943638]
 [-0.01037474]] , b: [ 0.14197744] , loss: 0.250851
w: [[-0.22769529]
 [-0.0107233 ]] , b: [ 0.14118296] , loss: 0.250838
w: [[-0.22596833]
 [-0.01106604]] , b: [ 0.14039251] , loss: 0.250826
w: [[-0.22425538]
 [-0.01140301]] , b: [ 0.1396061] , loss: 0.250814
w: [[-0.22255632]
 [-0.01173428]] , b: [ 0.1388237] , loss: 0.250802
w: [[-0.22087105]
 [-0.01205991]] , b: [ 0.13804531] , loss: 0.25079
w: [[-0.21919942]
 [-0.01237997]] , b: [ 0.13727091] , loss: 0.250778
w: [[-0.21754132]
 [-0.01269452]] , b: [ 0.13650048] , loss: 0.250767
w: [[-0.21589665]
 [-0.01300364]] , b: [ 0.13573401] , loss: 0.250756
w: [[-0.21426529]
 [-0.01330737]] , b: [ 0.13497147] , loss: 0.250745
w: [[-0.21264711]
 [-0.01360578]] , b: [ 0.13421287] , loss: 0.250734
w: [[-0.211042  ]
 [-0.01389892]] , b: [ 0.1334582] , loss: 0.250723
w: [[-0.20944986]
 [-0.01418687]] , b: [ 0.13270742] , loss: 0.250712
w: [[-0.20787057]
 [-0.01446969]] , b: [ 0.13196053] , loss: 0.250702
w: [[-0.20630403]
 [-0.01474743]] , b: [ 0.13121751] , loss: 0.250692
w: [[-0.20475011]
 [-0.01502015]] , b: [ 0.13047835] , loss: 0.250682
w: [[-0.20320871]
 [-0.0152879 ]] , b: [ 0.12974304] , loss: 0.250672
w: [[-0.20167972]
 [-0.01555076]] , b: [ 0.12901156] , loss: 0.250662
w: [[-0.20016305]
 [-0.01580878]] , b: [ 0.12828387] , loss: 0.250652
w: [[-0.19865857]
 [-0.016062  ]] , b: [ 0.12756] , loss: 0.250643
w: [[-0.19716619]
 [-0.0163105 ]] , b: [ 0.12683992] , loss: 0.250634
w: [[-0.1956858 ]
 [-0.01655432]] , b: [ 0.12612359] , loss: 0.250624
w: [[-0.19421731]
 [-0.01679353]] , b: [ 0.12541103] , loss: 0.250615
w: [[-0.1927606 ]
 [-0.01702817]] , b: [ 0.12470221] , loss: 0.250606
w: [[-0.19131558]
 [-0.0172583 ]] , b: [ 0.12399711] , loss: 0.250598
w: [[-0.18988213]
 [-0.01748398]] , b: [ 0.12329571] , loss: 0.250589
w: [[-0.18846017]
 [-0.01770525]] , b: [ 0.12259801] , loss: 0.250581
w: [[-0.1870496 ]
 [-0.01792218]] , b: [ 0.12190399] , loss: 0.250572
w: [[-0.18565032]
 [-0.01813481]] , b: [ 0.12121364] , loss: 0.250564
w: [[-0.18426223]
 [-0.01834321]] , b: [ 0.12052692] , loss: 0.250556
w: [[-0.18288524]
 [-0.01854741]] , b: [ 0.11984384] , loss: 0.250548
w: [[-0.18151926]
 [-0.01874747]] , b: [ 0.11916438] , loss: 0.25054
w: [[-0.18016419]
 [-0.01894344]] , b: [ 0.11848853] , loss: 0.250532
w: [[-0.17881992]
 [-0.01913537]] , b: [ 0.11781627] , loss: 0.250525
w: [[-0.17748639]
 [-0.01932332]] , b: [ 0.11714757] , loss: 0.250517
w: [[-0.17616348]
 [-0.01950732]] , b: [ 0.11648244] , loss: 0.25051
w: [[-0.17485112]
 [-0.01968742]] , b: [ 0.11582085] , loss: 0.250502
w: [[-0.17354921]
 [-0.01986369]] , b: [ 0.11516279] , loss: 0.250495
w: [[-0.17225765]
 [-0.02003615]] , b: [ 0.11450825] , loss: 0.250488
w: [[-0.17097637]
 [-0.02020487]] , b: [ 0.1138572] , loss: 0.250481
w: [[-0.16970527]
 [-0.02036988]] , b: [ 0.11320964] , loss: 0.250474
w: [[-0.16844428]
 [-0.02053123]] , b: [ 0.11256554] , loss: 0.250468
w: [[-0.16719329]
 [-0.02068898]] , b: [ 0.11192489] , loss: 0.250461
w: [[-0.16595224]
 [-0.02084316]] , b: [ 0.11128768] , loss: 0.250454
w: [[-0.16472103]
 [-0.02099383]] , b: [ 0.1106539] , loss: 0.250448
w: [[-0.16349958]
 [-0.02114101]] , b: [ 0.11002353] , loss: 0.250442
w: [[-0.1622878 ]
 [-0.02128476]] , b: [ 0.10939655] , loss: 0.250435
w: [[-0.16108562]
 [-0.02142513]] , b: [ 0.10877294] , loss: 0.250429
w: [[-0.15989296]
 [-0.02156215]] , b: [ 0.1081527] , loss: 0.250423
w: [[-0.15870972]
 [-0.02169587]] , b: [ 0.1075358] , loss: 0.250417
w: [[-0.15753584]
 [-0.02182632]] , b: [ 0.10692224] , loss: 0.250411
w: [[-0.15637124]
 [-0.02195357]] , b: [ 0.10631199] , loss: 0.250406
w: [[-0.15521583]
 [-0.02207763]] , b: [ 0.10570505] , loss: 0.2504
w: [[-0.15406954]
 [-0.02219856]] , b: [ 0.10510139] , loss: 0.250394
w: [[-0.15293229]
 [-0.02231639]] , b: [ 0.10450101] , loss: 0.250389
w: [[-0.15180399]
 [-0.02243117]] , b: [ 0.10390388] , loss: 0.250383
w: [[-0.15068458]
 [-0.02254293]] , b: [ 0.10331] , loss: 0.250378
w: [[-0.149574  ]
 [-0.02265172]] , b: [ 0.10271934] , loss: 0.250373
w: [[-0.14847215]
 [-0.02275757]] , b: [ 0.1021319] , loss: 0.250367
w: [[-0.14737895]
 [-0.02286052]] , b: [ 0.10154767] , loss: 0.250362
w: [[-0.14629436]
 [-0.02296061]] , b: [ 0.10096661] , loss: 0.250357
w: [[-0.14521828]
 [-0.02305789]] , b: [ 0.10038872] , loss: 0.250352
w: [[-0.14415064]
 [-0.02315237]] , b: [ 0.09981399] , loss: 0.250347
w: [[-0.1430914]
 [-0.0232441]] , b: [ 0.09924239] , loss: 0.250342
w: [[-0.14204045]
 [-0.02333312]] , b: [ 0.09867392] , loss: 0.250338
w: [[-0.14099772]
 [-0.02341947]] , b: [ 0.09810856] , loss: 0.250333
w: [[-0.13996318]
 [-0.02350318]] , b: [ 0.09754629] , loss: 0.250328
w: [[-0.13893673]
 [-0.02358428]] , b: [ 0.09698711] , loss: 0.250324
w: [[-0.13791831]
 [-0.02366281]] , b: [ 0.09643099] , loss: 0.250319
w: [[-0.13690785]
 [-0.0237388 ]] , b: [ 0.09587793] , loss: 0.250315
w: [[-0.13590528]
 [-0.0238123 ]] , b: [ 0.09532791] , loss: 0.250311
w: [[-0.13491055]
 [-0.02388333]] , b: [ 0.09478089] , loss: 0.250306
w: [[-0.13392359]
 [-0.02395193]] , b: [ 0.0942369] , loss: 0.250302
w: [[-0.13294433]
 [-0.02401812]] , b: [ 0.09369589] , loss: 0.250298
w: [[-0.1319727 ]
 [-0.02408195]] , b: [ 0.09315786] , loss: 0.250294
w: [[-0.13100864]
 [-0.02414344]] , b: [ 0.0926228] , loss: 0.25029
w: [[-0.13005209]
 [-0.02420264]] , b: [ 0.09209069] , loss: 0.250286
w: [[-0.12910298]
 [-0.02425955]] , b: [ 0.09156152] , loss: 0.250282
w: [[-0.12816125]
 [-0.02431423]] , b: [ 0.09103527] , loss: 0.250278
w: [[-0.12722684]
 [-0.0243667 ]] , b: [ 0.09051193] , loss: 0.250274
w: [[-0.12629971]
 [-0.024417  ]] , b: [ 0.08999147] , loss: 0.25027
w: [[-0.12537977]
 [-0.02446515]] , b: [ 0.0894739] , loss: 0.250267
w: [[-0.12446697]
 [-0.02451118]] , b: [ 0.08895919] , loss: 0.250263
w: [[-0.12356126]
 [-0.02455512]] , b: [ 0.08844733] , loss: 0.250259
w: [[-0.12266256]
 [-0.024597  ]] , b: [ 0.08793832] , loss: 0.250256
w: [[-0.12177083]
 [-0.02463685]] , b: [ 0.08743213] , loss: 0.250252
w: [[-0.12088601]
 [-0.02467471]] , b: [ 0.08692874] , loss: 0.250249
w: [[-0.12000804]
 [-0.0247106 ]] , b: [ 0.08642815] , loss: 0.250245
w: [[-0.11913686]
 [-0.02474453]] , b: [ 0.08593035] , loss: 0.250242
w: [[-0.1182724 ]
 [-0.02477655]] , b: [ 0.08543531] , loss: 0.250239
w: [[-0.11741464]
 [-0.02480669]] , b: [ 0.08494302] , loss: 0.250236
w: [[-0.1165635 ]
 [-0.02483496]] , b: [ 0.08445346] , loss: 0.250232
w: [[-0.11571893]
 [-0.0248614 ]] , b: [ 0.08396664] , loss: 0.250229
w: [[-0.11488087]
 [-0.02488603]] , b: [ 0.08348254] , loss: 0.250226
w: [[-0.11404929]
 [-0.02490887]] , b: [ 0.08300113] , loss: 0.250223
w: [[-0.1132241 ]
 [-0.02492996]] , b: [ 0.08252241] , loss: 0.25022
w: [[-0.11240527]
 [-0.02494933]] , b: [ 0.08204635] , loss: 0.250217
w: [[-0.11159275]
 [-0.02496699]] , b: [ 0.08157296] , loss: 0.250214
w: [[-0.11078648]
 [-0.02498296]] , b: [ 0.08110221] , loss: 0.250211
w: [[-0.1099864 ]
 [-0.02499729]] , b: [ 0.08063409] , loss: 0.250208
w: [[-0.10919248]
 [-0.02500998]] , b: [ 0.08016859] , loss: 0.250205
w: [[-0.10840464]
 [-0.02502107]] , b: [ 0.0797057] , loss: 0.250203
w: [[-0.10762286]
 [-0.02503058]] , b: [ 0.0792454] , loss: 0.2502
w: [[-0.10684708]
 [-0.02503852]] , b: [ 0.07878768] , loss: 0.250197
w: [[-0.10607724]
 [-0.02504493]] , b: [ 0.07833254] , loss: 0.250195
w: [[-0.10531331]
 [-0.02504984]] , b: [ 0.07787993] , loss: 0.250192
w: [[-0.10455523]
 [-0.02505325]] , b: [ 0.07742987] , loss: 0.250189
w: [[-0.10380295]
 [-0.0250552 ]] , b: [ 0.07698233] , loss: 0.250187
w: [[-0.10305642]
 [-0.0250557 ]] , b: [ 0.0765373] , loss: 0.250184
w: [[-0.10231562]
 [-0.02505478]] , b: [ 0.07609478] , loss: 0.250182
w: [[-0.10158047]
 [-0.02505246]] , b: [ 0.07565474] , loss: 0.250179
w: [[-0.10085093]
 [-0.02504877]] , b: [ 0.07521718] , loss: 0.250177
w: [[-0.10012697]
 [-0.02504372]] , b: [ 0.07478207] , loss: 0.250175
w: [[-0.09940854]
 [-0.02503733]] , b: [ 0.07434941] , loss: 0.250172
w: [[-0.09869559]
 [-0.02502963]] , b: [ 0.07391918] , loss: 0.25017
w: [[-0.09798807]
 [-0.02502063]] , b: [ 0.07349139] , loss: 0.250168
w: [[-0.09728594]
 [-0.02501037]] , b: [ 0.073066] , loss: 0.250166
w: [[-0.09658916]
 [-0.02499885]] , b: [ 0.07264302] , loss: 0.250163
w: [[-0.09589768]
 [-0.0249861 ]] , b: [ 0.07222242] , loss: 0.250161
w: [[-0.09521146]
 [-0.02497213]] , b: [ 0.0718042] , loss: 0.250159
w: [[-0.09453046]
 [-0.02495697]] , b: [ 0.07138833] , loss: 0.250157
w: [[-0.09385464]
 [-0.02494063]] , b: [ 0.07097482] , loss: 0.250155
w: [[-0.09318394]
 [-0.02492314]] , b: [ 0.07056364] , loss: 0.250153
w: [[-0.09251834]
 [-0.02490451]] , b: [ 0.07015478] , loss: 0.250151
w: [[-0.09185778]
 [-0.02488476]] , b: [ 0.06974824] , loss: 0.250149
w: [[-0.09120223]
 [-0.02486391]] , b: [ 0.06934399] , loss: 0.250147
w: [[-0.09055165]
 [-0.02484198]] , b: [ 0.06894203] , loss: 0.250145
w: [[-0.089906  ]
 [-0.02481898]] , b: [ 0.06854235] , loss: 0.250143
w: [[-0.08926524]
 [-0.02479493]] , b: [ 0.06814493] , loss: 0.250141
w: [[-0.08862933]
 [-0.02476986]] , b: [ 0.06774977] , loss: 0.250139
w: [[-0.08799823]
 [-0.02474377]] , b: [ 0.06735684] , loss: 0.250137
w: [[-0.08737189]
 [-0.02471669]] , b: [ 0.06696615] , loss: 0.250135
w: [[-0.08675029]
 [-0.02468863]] , b: [ 0.06657767] , loss: 0.250134
w: [[-0.08613338]
 [-0.02465961]] , b: [ 0.06619139] , loss: 0.250132
w: [[-0.08552112]
 [-0.02462964]] , b: [ 0.06580731] , loss: 0.25013
w: [[-0.08491348]
 [-0.02459875]] , b: [ 0.06542541] , loss: 0.250128
w: [[-0.08431043]
 [-0.02456694]] , b: [ 0.06504568] , loss: 0.250127
w: [[-0.08371193]
 [-0.02453424]] , b: [ 0.0646681] , loss: 0.250125
w: [[-0.08311793]
 [-0.02450066]] , b: [ 0.06429266] , loss: 0.250123
w: [[-0.0825284 ]
 [-0.02446621]] , b: [ 0.06391937] , loss: 0.250122
w: [[-0.08194331]
 [-0.02443091]] , b: [ 0.06354819] , loss: 0.25012
w: [[-0.08136262]
 [-0.02439477]] , b: [ 0.06317913] , loss: 0.250119
w: [[-0.0807863 ]
 [-0.02435782]] , b: [ 0.06281216] , loss: 0.250117
w: [[-0.0802143 ]
 [-0.02432006]] , b: [ 0.06244729] , loss: 0.250116
w: [[-0.07964659]
 [-0.0242815 ]] , b: [ 0.0620845] , loss: 0.250114
w: [[-0.07908315]
 [-0.02424218]] , b: [ 0.06172378] , loss: 0.250113
w: [[-0.07852393]
 [-0.02420209]] , b: [ 0.0613651] , loss: 0.250111
w: [[-0.07796891]
 [-0.02416125]] , b: [ 0.06100847] , loss: 0.25011
w: [[-0.07741804]
 [-0.02411967]] , b: [ 0.06065388] , loss: 0.250108
w: [[-0.07687131]
 [-0.02407738]] , b: [ 0.0603013] , loss: 0.250107
w: [[-0.07632866]
 [-0.02403438]] , b: [ 0.05995075] , loss: 0.250105
w: [[-0.07579008]
 [-0.02399068]] , b: [ 0.05960218] , loss: 0.250104
w: [[-0.07525553]
 [-0.02394631]] , b: [ 0.0592556] , loss: 0.250103
w: [[-0.07472496]
 [-0.02390126]] , b: [ 0.05891101] , loss: 0.250101
w: [[-0.07419837]
 [-0.02385556]] , b: [ 0.05856839] , loss: 0.2501
w: [[-0.07367569]
 [-0.02380921]] , b: [ 0.05822773] , loss: 0.250099
w: [[-0.07315693]
 [-0.02376224]] , b: [ 0.05788902] , loss: 0.250097
w: [[-0.07264204]
 [-0.02371466]] , b: [ 0.05755224] , loss: 0.250096
w: [[-0.07213098]
 [-0.02366646]] , b: [ 0.05721738] , loss: 0.250095
w: [[-0.07162374]
 [-0.02361768]] , b: [ 0.05688444] , loss: 0.250094
w: [[-0.07112027]
 [-0.02356831]] , b: [ 0.0565534] , loss: 0.250093
w: [[-0.07062055]
 [-0.02351837]] , b: [ 0.05622426] , loss: 0.250091
w: [[-0.07012455]
 [-0.02346787]] , b: [ 0.055897] , loss: 0.25009
w: [[-0.06963224]
 [-0.02341683]] , b: [ 0.05557162] , loss: 0.250089
w: [[-0.06914359]
 [-0.02336525]] , b: [ 0.0552481] , loss: 0.250088
w: [[-0.06865857]
 [-0.02331315]] , b: [ 0.05492643] , loss: 0.250087
w: [[-0.06817715]
 [-0.02326054]] , b: [ 0.05460662] , loss: 0.250086
w: [[-0.06769931]
 [-0.02320742]] , b: [ 0.05428863] , loss: 0.250084
w: [[-0.06722501]
 [-0.02315382]] , b: [ 0.05397245] , loss: 0.250083
w: [[-0.06675422]
 [-0.02309974]] , b: [ 0.0536581] , loss: 0.250082
w: [[-0.06628692]
 [-0.02304518]] , b: [ 0.05334555] , loss: 0.250081
w: [[-0.06582309]
 [-0.02299016]] , b: [ 0.0530348] , loss: 0.25008
w: [[-0.06536269]
 [-0.02293469]] , b: [ 0.05272582] , loss: 0.250079
w: [[-0.0649057 ]
 [-0.02287879]] , b: [ 0.05241863] , loss: 0.250078
w: [[-0.06445208]
 [-0.02282245]] , b: [ 0.05211319] , loss: 0.250077
w: [[-0.06400184]
 [-0.0227657 ]] , b: [ 0.0518095] , loss: 0.250076
w: [[-0.06355491]
 [-0.02270853]] , b: [ 0.05150757] , loss: 0.250075
w: [[-0.06311128]
 [-0.02265096]] , b: [ 0.05120736] , loss: 0.250074
w: [[-0.06267093]
 [-0.02259301]] , b: [ 0.05090888] , loss: 0.250073
w: [[-0.06223383]
 [-0.02253467]] , b: [ 0.05061213] , loss: 0.250072
w: [[-0.06179995]
 [-0.02247596]] , b: [ 0.05031707] , loss: 0.250071
w: [[-0.06136927]
 [-0.02241689]] , b: [ 0.05002371] , loss: 0.25007
w: [[-0.06094177]
 [-0.02235746]] , b: [ 0.04973203] , loss: 0.25007
w: [[-0.06051743]
 [-0.0222977 ]] , b: [ 0.04944203] , loss: 0.250069
w: [[-0.0600962 ]
 [-0.02223759]] , b: [ 0.0491537] , loss: 0.250068
w: [[-0.05967807]
 [-0.02217715]] , b: [ 0.04886703] , loss: 0.250067
w: [[-0.05926301]
 [-0.0221164 ]] , b: [ 0.04858201] , loss: 0.250066
w: [[-0.05885101]
 [-0.02205533]] , b: [ 0.04829864] , loss: 0.250065
w: [[-0.05844203]
 [-0.02199396]] , b: [ 0.04801689] , loss: 0.250064
w: [[-0.05803606]
 [-0.02193229]] , b: [ 0.04773678] , loss: 0.250064
w: [[-0.05763306]
 [-0.02187034]] , b: [ 0.04745826] , loss: 0.250063
w: [[-0.05723303]
 [-0.02180811]] , b: [ 0.04718136] , loss: 0.250062
w: [[-0.05683593]
 [-0.02174561]] , b: [ 0.04690606] , loss: 0.250061
w: [[-0.05644174]
 [-0.02168285]] , b: [ 0.04663233] , loss: 0.25006
w: [[-0.05605045]
 [-0.02161983]] , b: [ 0.04636018] , loss: 0.25006
w: [[-0.05566201]
 [-0.02155656]] , b: [ 0.04608961] , loss: 0.250059
w: [[-0.05527642]
 [-0.02149305]] , b: [ 0.0458206] , loss: 0.250058
w: [[-0.05489365]
 [-0.02142931]] , b: [ 0.04555314] , loss: 0.250057
w: [[-0.05451367]
 [-0.02136534]] , b: [ 0.04528723] , loss: 0.250057
w: [[-0.05413648]
 [-0.02130115]] , b: [ 0.04502284] , loss: 0.250056
w: [[-0.05376205]
 [-0.02123675]] , b: [ 0.04475997] , loss: 0.250055
w: [[-0.05339035]
 [-0.02117215]] , b: [ 0.04449864] , loss: 0.250055
w: [[-0.05302136]
 [-0.02110734]] , b: [ 0.04423881] , loss: 0.250054
w: [[-0.05265506]
 [-0.02104235]] , b: [ 0.04398048] , loss: 0.250053
w: [[-0.05229144]
 [-0.02097717]] , b: [ 0.04372364] , loss: 0.250052
w: [[-0.05193046]
 [-0.0209118 ]] , b: [ 0.04346829] , loss: 0.250052
w: [[-0.05157212]
 [-0.02084628]] , b: [ 0.0432144] , loss: 0.250051
w: [[-0.05121639]
 [-0.02078058]] , b: [ 0.04296198] , loss: 0.250051
w: [[-0.05086325]
 [-0.02071473]] , b: [ 0.04271103] , loss: 0.25005
w: [[-0.05051268]
 [-0.02064872]] , b: [ 0.04246153] , loss: 0.250049
w: [[-0.05016465]
 [-0.02058257]] , b: [ 0.04221347] , loss: 0.250049
w: [[-0.04981916]
 [-0.02051627]] , b: [ 0.04196684] , loss: 0.250048
w: [[-0.04947618]
 [-0.02044984]] , b: [ 0.04172164] , loss: 0.250047
w: [[-0.04913569]
 [-0.02038328]] , b: [ 0.04147786] , loss: 0.250047
w: [[-0.04879767]
 [-0.0203166 ]] , b: [ 0.0412355] , loss: 0.250046
w: [[-0.0484621 ]
 [-0.02024979]] , b: [ 0.04099453] , loss: 0.250046
w: [[-0.04812897]
 [-0.02018288]] , b: [ 0.04075496] , loss: 0.250045
w: [[-0.04779825]
 [-0.02011585]] , b: [ 0.04051677] , loss: 0.250044
w: [[-0.04746994]
 [-0.02004873]] , b: [ 0.04027997] , loss: 0.250044
w: [[-0.047144  ]
 [-0.01998151]] , b: [ 0.04004453] , loss: 0.250043
w: [[-0.04682042]
 [-0.0199142 ]] , b: [ 0.03981045] , loss: 0.250043
w: [[-0.04649918]
 [-0.01984681]] , b: [ 0.03957773] , loss: 0.250042
w: [[-0.04618027]
 [-0.01977935]] , b: [ 0.03934635] , loss: 0.250042
w: [[-0.04586366]
 [-0.0197118 ]] , b: [ 0.03911632] , loss: 0.250041
w: [[-0.04554934]
 [-0.01964418]] , b: [ 0.03888763] , loss: 0.250041
w: [[-0.04523729]
 [-0.0195765 ]] , b: [ 0.03866025] , loss: 0.25004
w: [[-0.0449275 ]
 [-0.01950876]] , b: [ 0.0384342] , loss: 0.25004
w: [[-0.04461994]
 [-0.01944095]] , b: [ 0.03820945] , loss: 0.250039
w: [[-0.0443146]
 [-0.0193731]] , b: [ 0.03798601] , loss: 0.250039
w: [[-0.04401146]
 [-0.01930521]] , b: [ 0.03776386] , loss: 0.250038
w: [[-0.0437105 ]
 [-0.01923727]] , b: [ 0.037543] , loss: 0.250038
w: [[-0.04341172]
 [-0.0191693 ]] , b: [ 0.03732342] , loss: 0.250037
w: [[-0.04311508]
 [-0.01910129]] , b: [ 0.03710511] , loss: 0.250037
w: [[-0.04282059]
 [-0.01903325]] , b: [ 0.03688807] , loss: 0.250036
w: [[-0.04252821]
 [-0.01896519]] , b: [ 0.0366723] , loss: 0.250036
w: [[-0.04223793]
 [-0.01889711]] , b: [ 0.03645777] , loss: 0.250035
w: [[-0.04194974]
 [-0.01882901]] , b: [ 0.03624449] , loss: 0.250035
w: [[-0.04166362]
 [-0.0187609 ]] , b: [ 0.03603245] , loss: 0.250035
w: [[-0.04137956]
 [-0.01869279]] , b: [ 0.03582163] , loss: 0.250034
w: [[-0.04109754]
 [-0.01862467]] , b: [ 0.03561203] , loss: 0.250034
w: [[-0.04081755]
 [-0.01855656]] , b: [ 0.03540365] , loss: 0.250033
w: [[-0.04053956]
 [-0.01848844]] , b: [ 0.03519649] , loss: 0.250033
w: [[-0.04026356]
 [-0.01842033]] , b: [ 0.03499053] , loss: 0.250032
w: [[-0.03998954]
 [-0.01835224]] , b: [ 0.03478577] , loss: 0.250032
w: [[-0.03971749]
 [-0.01828416]] , b: [ 0.0345822] , loss: 0.250032
w: [[-0.03944738]
 [-0.01821609]] , b: [ 0.03437981] , loss: 0.250031
w: [[-0.03917921]
 [-0.01814806]] , b: [ 0.03417859] , loss: 0.250031
w: [[-0.03891296]
 [-0.01808004]] , b: [ 0.03397854] , loss: 0.250031
w: [[-0.03864861]
 [-0.01801206]] , b: [ 0.03377966] , loss: 0.25003
w: [[-0.03838615]
 [-0.0179441 ]] , b: [ 0.03358193] , loss: 0.25003
w: [[-0.03812556]
 [-0.01787619]] , b: [ 0.03338535] , loss: 0.250029
w: [[-0.03786684]
 [-0.01780832]] , b: [ 0.03318991] , loss: 0.250029
w: [[-0.03760996]
 [-0.01774048]] , b: [ 0.0329956] , loss: 0.250029
w: [[-0.03735492]
 [-0.0176727 ]] , b: [ 0.03280243] , loss: 0.250028
w: [[-0.03710169]
 [-0.01760496]] , b: [ 0.03261038] , loss: 0.250028
w: [[-0.03685027]
 [-0.01753728]] , b: [ 0.03241945] , loss: 0.250028
w: [[-0.03660065]
 [-0.01746965]] , b: [ 0.03222962] , loss: 0.250027
w: [[-0.03635281]
 [-0.01740208]] , b: [ 0.0320409] , loss: 0.250027
w: [[-0.03610672]
 [-0.01733456]] , b: [ 0.03185328] , loss: 0.250027
w: [[-0.03586239]
 [-0.01726711]] , b: [ 0.03166676] , loss: 0.250026
w: [[-0.0356198 ]
 [-0.01719973]] , b: [ 0.03148131] , loss: 0.250026
w: [[-0.03537894]
 [-0.01713242]] , b: [ 0.03129694] , loss: 0.250026
w: [[-0.03513978]
 [-0.01706518]] , b: [ 0.03111365] , loss: 0.250025
w: [[-0.03490233]
 [-0.01699802]] , b: [ 0.03093142] , loss: 0.250025
w: [[-0.03466656]
 [-0.01693093]] , b: [ 0.03075025] , loss: 0.250025
w: [[-0.03443247]
 [-0.01686392]] , b: [ 0.03057014] , loss: 0.250024
w: [[-0.03420003]
 [-0.016797  ]] , b: [ 0.03039108] , loss: 0.250024
w: [[-0.03396925]
 [-0.01673016]] , b: [ 0.03021306] , loss: 0.250024
w: [[-0.0337401]
 [-0.0166634]] , b: [ 0.03003608] , loss: 0.250023
w: [[-0.03351258]
 [-0.01659674]] , b: [ 0.02986013] , loss: 0.250023
w: [[-0.03328667]
 [-0.01653017]] , b: [ 0.0296852] , loss: 0.250023
w: [[-0.03306235]
 [-0.0164637 ]] , b: [ 0.02951129] , loss: 0.250023
w: [[-0.03283963]
 [-0.01639731]] , b: [ 0.0293384] , loss: 0.250022
w: [[-0.03261848]
 [-0.01633104]] , b: [ 0.0291665] , loss: 0.250022
w: [[-0.0323989 ]
 [-0.01626486]] , b: [ 0.02899561] , loss: 0.250022
w: [[-0.03218087]
 [-0.01619878]] , b: [ 0.02882573] , loss: 0.250022
w: [[-0.03196438]
 [-0.01613281]] , b: [ 0.02865682] , loss: 0.250021
w: [[-0.03174943]
 [-0.01606696]] , b: [ 0.0284889] , loss: 0.250021
w: [[-0.03153599]
 [-0.01600121]] , b: [ 0.02832196] , loss: 0.250021
w: [[-0.03132405]
 [-0.01593557]] , b: [ 0.028156] , loss: 0.250021
w: [[-0.03111362]
 [-0.01587005]] , b: [ 0.02799099] , loss: 0.25002
w: [[-0.03090466]
 [-0.01580464]] , b: [ 0.02782695] , loss: 0.25002
w: [[-0.03069718]
 [-0.01573936]] , b: [ 0.02766387] , loss: 0.25002
w: [[-0.03049116]
 [-0.01567418]] , b: [ 0.02750174] , loss: 0.25002
w: [[-0.0302866 ]
 [-0.01560913]] , b: [ 0.02734056] , loss: 0.250019
w: [[-0.03008347]
 [-0.01554421]] , b: [ 0.02718031] , loss: 0.250019
w: [[-0.02988178]
 [-0.01547941]] , b: [ 0.027021] , loss: 0.250019
w: [[-0.0296815 ]
 [-0.01541473]] , b: [ 0.02686261] , loss: 0.250019
w: [[-0.02948263]
 [-0.01535019]] , b: [ 0.02670516] , loss: 0.250018
w: [[-0.02928516]
 [-0.01528577]] , b: [ 0.02654862] , loss: 0.250018
w: [[-0.02908908]
 [-0.01522149]] , b: [ 0.02639298] , loss: 0.250018
w: [[-0.02889438]
 [-0.01515734]] , b: [ 0.02623826] , loss: 0.250018
w: [[-0.02870104]
 [-0.01509333]] , b: [ 0.02608444] , loss: 0.250017
w: [[-0.02850907]
 [-0.01502945]] , b: [ 0.02593151] , loss: 0.250017
w: [[-0.02831843]
 [-0.01496571]] , b: [ 0.02577948] , loss: 0.250017
w: [[-0.02812914]
 [-0.01490211]] , b: [ 0.02562834] , loss: 0.250017
w: [[-0.02794116]
 [-0.01483865]] , b: [ 0.02547809] , loss: 0.250017
w: [[-0.02775451]
 [-0.01477533]] , b: [ 0.02532871] , loss: 0.250016
w: [[-0.02756917]
 [-0.01471215]] , b: [ 0.0251802] , loss: 0.250016
w: [[-0.02738512]
 [-0.01464912]] , b: [ 0.02503255] , loss: 0.250016
w: [[-0.02720236]
 [-0.01458624]] , b: [ 0.02488577] , loss: 0.250016
w: [[-0.02702088]
 [-0.0145235 ]] , b: [ 0.02473985] , loss: 0.250016
w: [[-0.02684066]
 [-0.01446091]] , b: [ 0.02459478] , loss: 0.250015
w: [[-0.02666171]
 [-0.01439848]] , b: [ 0.02445055] , loss: 0.250015
w: [[-0.02648401]
 [-0.01433619]] , b: [ 0.02430717] , loss: 0.250015
w: [[-0.02630754]
 [-0.01427405]] , b: [ 0.02416462] , loss: 0.250015
w: [[-0.02613232]
 [-0.01421207]] , b: [ 0.02402291] , loss: 0.250015
w: [[-0.02595831]
 [-0.01415025]] , b: [ 0.02388202] , loss: 0.250015
...
w: [[-0.00134616]
 [-0.00120324]] , b: [ 0.00151243] , loss: 0.25
w: [[-0.00133799]
 [-0.00119641]] , b: [ 0.00150353] , loss: 0.25

Inspect the trained model parameters and the model outputs. What is the minimum found by the optimizer?


In [18]:
print(sess.run(y_model,{x:x_train}))


[[ 0.50037587]
 [ 0.50007677]
 [ 0.50004137]
 [ 0.49974227]]
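
The outputs above are all close to 0.5: on the XOR labels, a constant prediction of 0.5 gives a mean squared error of exactly 0.25, which matches the plateau in the loss above. A quick standalone NumPy check of this value (using the same labels as before):


import numpy as np

y_train = np.array([[0.], [1.], [1.], [0.]])    # XOR target labels
constant_pred = np.full_like(y_train, 0.5)      # a constant prediction of 0.5
print(np.mean((constant_pred - y_train) ** 2))  # -> 0.25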

Multilayer Perceptron

A multilayer perceptron is a feedforward network that can be thought of as a model composed of multiple nested functions, for instance:

$$y = f^{(3)}(f^{(2)}(f^{(1)}(x)))$$

This means that the output of each function is fed as the input to the next one; this data flow is strictly one-directional (hence "feedforward"), and the model may contain multiple layers of nested functions (hence "deep"). TensorFlow is a very suitable tool for building and training such models. Here we will consider the XOR problem once again, and build a multilayer perceptron that classifies the data correctly.
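
The nested-function view translates directly into code: each layer is just a function applied to the previous layer's output. A minimal plain-Python sketch (the three toy layer functions below are made-up placeholders, not the model trained later in this section):


# toy layer functions standing in for f(1), f(2), f(3)
def layer1(x): return 2.0 * x + 1.0   # an affine map
def layer2(x): return max(0.0, x)     # a simple nonlinearity (ReLU)
def layer3(x): return 0.5 * x - 3.0   # another affine map

y = layer3(layer2(layer1(4.0)))       # y = f3(f2(f1(x))) for x = 4.0
print(y)                              # 1.5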

It was demonstrated previously that the XOR data are not linearly separable - this means that a non-linear layer (function) is needed within the model to transform the problem into a linearly separable space. This is in fact the core of the multilayer perceptron, as well as of other deep learning models - nonlinear activation functions such as the logistic (sigmoid) function, $\tanh$, or ReLU. A comprehensive guide to the activation functions supported by TensorFlow can be found at: https://www.tensorflow.org/versions/r0.12/api_docs/python/nn/activation_functions_.
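
These activation functions are available directly in TensorFlow's tf.nn module. A small sketch evaluating a few of them on arbitrary sample values (assuming a tf.Session named sess is available, as elsewhere in this notebook):


z = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sess.run(tf.nn.sigmoid(z)))   # logistic function, squashes values into (0, 1)
print(sess.run(tf.nn.tanh(z)))      # hyperbolic tangent, squashes values into (-1, 1)
print(sess.run(tf.nn.relu(z)))      # rectified linear unit, max(0, z)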

Let us build a multilayer perceptron model where the sigmoid activation function is used for the hidden layer. Let:

  • $f^{(1)}(x) = W^{(1)}x + b^{(1)}$
  • $f^{(2)}(x) = {1}/({1+e^{-x}})$
  • $f^{(3)}(x) = W^{(2)}x + b^{(2)}$

with $W^{(1)} \in \mathbb{R}^{2\times 2}$, $b^{(1)} \in \mathbb{R}^{2\times 1}$, $W^{(2)} \in \mathbb{R}^{2\times 1}$, and $b^{(2)} \in \mathbb{R}$. Note that in the implementation below each training example is stored as a row of the input matrix, so the affine layers are computed as $xW + b$ (with the shapes listed above) rather than $Wx + b$.


In [41]:
sess = tf.Session()

# XOR training data: the four input pairs and their target labels
x_train = np.array([[0,0],[0,1],[1,0],[1,1]])
y_train = np.array([[0],[1],[1],[0]])

# placeholders for the inputs and the targets
X = tf.placeholder(tf.float32,[None,2])
y = tf.placeholder(tf.float32,[None,1])

# hidden layer parameters: W1 (2x2) and b1 (2)
W1 = tf.Variable(tf.random_uniform([2,2]),name="weights1")
b1 = tf.Variable(tf.random_uniform([2]), name="bias1")

# output layer parameters: W2 (2x1) and b2 (1)
W2 = tf.Variable(tf.random_uniform([2,1]),name="weights2")
b2 = tf.Variable(tf.random_uniform([1]), name="bias2")

# f1: affine hidden layer, f2: sigmoid nonlinearity, y_model: affine output layer
f1 = tf.matmul(X,W1)+b1
f2 = tf.nn.sigmoid(f1)
y_model = tf.matmul(f2,W2)+b2

# mean squared error loss
loss = tf.reduce_mean(tf.square(y_model-y))

# plain gradient descent with learning rate 0.35
optimizer = tf.train.GradientDescentOptimizer(0.35)
#optimizer = tf.train.AdamOptimizer(0.1)
train = optimizer.minimize(loss)

sess.run(tf.global_variables_initializer())

# train for 1000 epochs on the full XOR dataset, printing the loss after every step
for epoch in range(1000):
    sess.run(train, feed_dict={X: x_train, y: y_train})
    print("loss:", sess.run(loss,{X: x_train, y: y_train}))


loss: 0.750267
loss: 0.313003
loss: 0.258538
loss: 0.251444
loss: 0.250557
loss: 0.250435
...
loss: 0.000396678
loss: 0.000382916
loss: 0.000369618
loss: 0.000356776
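
A common variant of the training loop reports the loss only periodically instead of after every epoch. A minimal sketch, using the same session and tensors as the cell above (note that re-running it continues training the already-initialized variables):


for epoch in range(1000):
    sess.run(train, feed_dict={X: x_train, y: y_train})
    if epoch % 100 == 0:
        print("epoch:", epoch, "loss:", sess.run(loss, {X: x_train, y: y_train}))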

The first layer $f^{(1)}(x) = W^{(1)}x + b^{(1)}$ is an affine (linear) transformation of the input, and thus cannot by itself transform the XOR problem into a linearly separable space. Let us inspect the trained parameters $W^{(1)}$ and $b^{(1)}$, and the output of the first layer.


In [20]:
# trained parameters of the first layer
print(sess.run(W1),'\n')
print(sess.run(b1),'\n')

# output of the first (affine) layer for the four XOR inputs
f1_out = sess.run(f1,{X: x_train, y: y_train})
print(f1_out,'\n')

# t and f: index masks for the two XOR classes (defined earlier in the notebook)
plt.scatter(f1_out[t,0],f1_out[t,1],c='b',marker='x',s=70)
plt.scatter(f1_out[f,0],f1_out[f,1],c='r',marker='o',s=70)


[[ 1.45533264  3.05270791]
 [ 1.44410062  3.06721687]] 

[ 0.67456263  0.49640512] 

[[ 0.67456263  0.49640512]
 [ 2.11866331  3.563622  ]
 [ 2.12989521  3.54911304]
 [ 3.57399583  6.61632967]] 

Out[20]:
<matplotlib.collections.PathCollection at 0x1218e3e80>

The next layer $f^{(2)}(x)$ is the sigmoid function, a nonlinear transformation of its input, which makes it possible to map the problem into a new space where the classes can become linearly separable.


In [21]:
# output of the sigmoid layer for the four XOR inputs
f2_out = sess.run(f2,{X: x_train, y: y_train})
print(f2_out)

plt.scatter(f2_out[t,0],f2_out[t,1],c='b',marker='x',s=70)
plt.scatter(f2_out[f,0],f2_out[f,1],c='r',marker='o',s=70)


[[ 0.66252404  0.6216141 ]
 [ 0.89270395  0.97244477]
 [ 0.89377505  0.97205329]
 [ 0.97272134  0.99866343]]
Out[21]:
<matplotlib.collections.PathCollection at 0x1218a1b38>

The final layer is the model output:


In [22]:
print("y: ",sess.run(y,{X: x_train, y: y_train}),"\n")
print("model: ",sess.run(y_model,{X: x_train, y: y_train}))


y:  [[ 0.]
 [ 1.]
 [ 1.]
 [ 0.]] 

model:  [[ 0.08845341]
 [ 0.63328826]
 [ 0.63199496]
 [ 0.64158225]]

The network seems to have learned to classify the XOR problem correctly, thanks to the multi-layered structure and the non-linear activation function in the hidden layer. This example illustrates some of the primary reasons for employing deep learning models, especially for highly non-linear problems where traditional linear approaches fail.
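
To turn the continuous model outputs into hard class predictions, they can be thresholded at 0.5 and compared against the labels. A minimal check, assuming the trained session and tensors from the cells above are still available:


import numpy as np

outputs = sess.run(y_model, {X: x_train})   # continuous network outputs
preds = (outputs > 0.5).astype(int)         # threshold at 0.5 to obtain class labels
print(np.hstack([preds, y_train]))          # predicted class next to the true label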


In [ ]: